-
Multi-view learning is a rapidly evolving research area focused on developing diverse learning representations. In neural data analysis, this approach holds immense potential by capturing spatial, temporal, and frequency features. Despite its promise, the multi-view approach has remained largely unexplored for functional near-infrared spectroscopy (fNIRS). This study addresses this gap by introducing fNIRSNET, a novel framework that generates and fuses multi-view spatio-temporal representations using convolutional neural networks. It investigates the combined informational strength of oxygenated (HbO2) and deoxygenated (HbR) hemoglobin signals and further extends these capabilities by integrating with electroencephalography (EEG) networks to achieve robust multimodal classification. Experiments involved classifying neural responses to auditory stimuli with nine healthy participants. fNIRS signals were decomposed into HbO2/HbR concentration changes, resulting in Parallel and Merged input types. We evaluated four input types across three data compositions: balanced, subject, and complete datasets. We compared fNIRSNET's performance with eight baseline classification models and merged it with four common EEG networks to assess the efficacy of combined features for multimodal classification. Compared to the baselines, fNIRSNET using the Merged input type achieved the highest accuracies of 83.22%, 81.18%, and 91.58% for the balanced, subject, and complete datasets, respectively. On the complete set, the approach effectively mitigated class-imbalance issues, achieving a sensitivity of 83.58% and a specificity of 95.42%. Multimodal fusion of EEG networks and fNIRSNET outperformed single-modality performance, with the highest accuracy of 87.15% on balanced data. Overall, this study introduces an innovative fusion approach for decoding fNIRS data and illustrates its integration with established EEG networks to enhance performance.
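The abstract reports accuracy alongside sensitivity and specificity because the complete dataset is class-imbalanced. As a reminder of how these three metrics relate to a binary confusion matrix, a minimal sketch (the counts below are invented for illustration, not the study's data):

```python
def binary_metrics(tp, fn, tn, fp):
    """Standard binary-classification metrics from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    sensitivity = tp / (tp + fn)   # true-positive rate (recall)
    specificity = tn / (tn + fp)   # true-negative rate
    return accuracy, sensitivity, specificity

# Hypothetical imbalanced counts: 200 positives vs. 1000 negatives.
acc, sens, spec = binary_metrics(tp=170, fn=30, tn=950, fp=50)
print(f"accuracy={acc:.3f} sensitivity={sens:.3f} specificity={spec:.3f}")
```

On imbalanced data a high accuracy can coexist with poor sensitivity, which is why both rates are quoted separately.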
-
Free, publicly-accessible full text available September 1, 2026
-
We present the first threefold differential measurement of neutral-pion multiplicity ratios produced in semi-inclusive deep-inelastic electron scattering on carbon, iron, and lead nuclei, normalized to deuterium, from CLAS at Jefferson Lab. We found that the neutral-pion multiplicity ratio is maximally suppressed for the leading hadrons (energy fraction z → 1), with the suppression varying from 25% in carbon up to 75% in lead. An enhancement of the multiplicity ratio at low z and high p_T^2 is observed, suggesting an interconnection between these two variables. This behavior is qualitatively similar to the previous twofold differential measurement of charged pions by the HERMES Collaboration and, recently, by the CLAS Collaboration. The largest enhancement was observed at high p_T^2 for the heavier nuclei, namely iron and lead, while the smallest enhancement was observed for the lightest nucleus, carbon. This behavior suggests a competition between partonic multiple scattering, which causes enhancement, and hadronic inelastic scattering, which causes suppression.
Free, publicly-accessible full text available September 1, 2026
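The multiplicity ratio underlying such measurements is conventionally defined as the per-DIS-event hadron yield on a nucleus A divided by the same quantity on deuterium. A minimal sketch of that definition (the counts are invented for illustration):

```python
def multiplicity_ratio(n_h_A, n_dis_A, n_h_D, n_dis_D):
    """R_A = (N_h / N_DIS)_A / (N_h / N_DIS)_D.

    n_h_*   : hadron (e.g., neutral-pion) counts in a kinematic bin
    n_dis_* : inclusive DIS electron counts on nucleus A / deuterium D
    """
    return (n_h_A / n_dis_A) / (n_h_D / n_dis_D)

# Hypothetical counts for a bin where the ratio comes out suppressed.
print(multiplicity_ratio(n_h_A=300, n_dis_A=10_000, n_h_D=400, n_dis_D=10_000))
```

A ratio below 1 in a bin signals suppression of hadron production in the nuclear medium; above 1, enhancement.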
-
Given an input stream S of size N, a φ-heavy hitter is an item that occurs at least φN times in S. The problem of finding heavy hitters is extensively studied in the database literature. We study a real-time heavy-hitters variant in which an element must be reported shortly after we see its T = φN-th occurrence (and hence it becomes a heavy hitter). We call this the Timely Event Detection (TED) problem. The TED problem models the needs of many real-world monitoring systems, which demand accurate (i.e., no false negatives) and timely reporting of all events from large, high-speed streams, and with a low reporting threshold (high sensitivity). Like the classic heavy-hitters problem, solving the TED problem without false positives requires large space (Ω(N) words). Thus, in-RAM heavy-hitters algorithms typically sacrifice accuracy (i.e., allow false positives), sensitivity, or timeliness (i.e., use multiple passes). We show how to adapt heavy-hitters algorithms to external memory to solve the TED problem on large high-speed streams while guaranteeing accuracy, sensitivity, and timeliness. Our data structures are limited only by I/O bandwidth (not latency) and support a tunable trade-off between reporting delay and I/O overhead. With a small bounded reporting delay, our algorithms incur only a logarithmic I/O overhead. We implement and validate our data structures empirically using the Firehose streaming benchmark. Multi-threaded versions of our structures can scale to process 11M observations per second before becoming CPU bound. In comparison, a naive adaptation of the standard heavy-hitters algorithm to external memory would be limited by the storage device's random I/O throughput, i.e., approximately 100K observations per second.
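The in-RAM algorithms the abstract alludes to trade accuracy for space. As one concrete example of that trade-off (not the paper's data structure), the classic Misra-Gries summary guarantees no false negatives for items above the N/k frequency threshold using only k-1 counters, at the cost of possible false positives that would normally need a second pass to remove:

```python
def misra_gries(stream, k):
    """One-pass Misra-Gries summary with at most k-1 counters.

    Every item occurring more than len(stream)/k times is guaranteed
    to survive in the summary; some non-heavy items may survive too
    (false positives), which is the accuracy trade-off noted above.
    """
    counters = {}
    for x in stream:
        if x in counters:
            counters[x] += 1
        elif len(counters) < k - 1:
            counters[x] = 1
        else:
            # Decrement every counter; drop any that reach zero.
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    return counters

stream = ["a"] * 40 + ["b"] * 30 + list("cdefghij") * 2
print(misra_gries(stream, k=4))  # "a" and "b" are guaranteed to survive
```

Note that the summary's counts are underestimates and the reporting is not timely: an item is only known to be heavy after the full pass, which is exactly the gap the TED problem targets.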
-
Measurements of the polarization observables for the reaction using a linearly polarized photon beam of energy 1.1 to 2.1 GeV are reported. The measured data provide information on a channel that has not been studied extensively but is required for a full coupled-channel analysis in the nucleon resonance region. Observables have been simultaneously extracted using likelihood sampling with a Markov chain Monte Carlo process. Angular distributions in bins of photon energy are produced for each polarization observable. These include first-time measurements of several observables in this reaction. The extraction extends the energy range beyond a previous measurement. The measurement of the recoil polarization is consistent with previous measurements. The measured data are shown to be significant enough to affect the estimation of the nucleon resonance parameters when fitted within a coupled-channels model. Published by the American Physical Society, 2025.
Free, publicly-accessible full text available February 1, 2026
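The observables above were extracted by sampling the likelihood with MCMC. As a generic illustration of that technique (not the collaboration's actual analysis code), a minimal random-walk Metropolis sampler over a single hypothetical parameter could look like:

```python
import math
import random

def metropolis(log_like, start, n_samples, step, seed=0):
    """Random-walk Metropolis: draw parameter values with density
    proportional to exp(log_like(theta))."""
    rng = random.Random(seed)
    theta = start
    samples = []
    for _ in range(n_samples):
        proposal = theta + rng.gauss(0.0, step)
        # Accept with probability min(1, L(proposal) / L(theta)).
        if math.log(rng.random()) < log_like(proposal) - log_like(theta):
            theta = proposal
        samples.append(theta)
    return samples

# Toy Gaussian log-likelihood centered at 0.3 (a stand-in observable).
samples = metropolis(lambda t: -0.5 * ((t - 0.3) / 0.05) ** 2,
                     start=0.0, n_samples=5000, step=0.05)
mean = sum(samples[1000:]) / len(samples[1000:])  # discard burn-in
```

The posterior mean and spread of the retained samples then serve as the extracted value and uncertainty of the observable.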
-
Measuring deeply virtual Compton scattering (DVCS) on the neutron is one of the necessary steps to understand the structure of the nucleon in terms of generalized parton distributions (GPDs). Neutron targets play a complementary role to transversely polarized proton targets in the determination of the GPD E. This poorly known and poorly constrained GPD is essential to obtain the contribution of the quarks' angular momentum to the spin of the nucleon. DVCS on the neutron was measured for the first time by selecting the exclusive final state through detection of the neutron, using the Jefferson Lab longitudinally polarized electron beam, with energies up to 10.6 GeV, and the CLAS12 detector. The extracted beam-spin asymmetries, combined with DVCS observables measured on the proton, allow a clean quark-flavor separation of the imaginary parts of the Compton form factors H and E. Published by the American Physical Society, 2024.
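The beam-spin asymmetry such analyses extract is, at its simplest, the normalized difference of event yields for the two beam-helicity states, corrected for the beam polarization. A hedged sketch of that definition (the yields and polarization value below are invented):

```python
def beam_spin_asymmetry(n_plus, n_minus, beam_pol):
    """A_LU = (1 / P_b) * (N+ - N-) / (N+ + N-).

    n_plus, n_minus : event yields for the two beam-helicity states
    beam_pol        : magnitude of the longitudinal beam polarization P_b
    """
    return (n_plus - n_minus) / (n_plus + n_minus) / beam_pol

# Hypothetical helicity-sorted yields in one kinematic bin, 85% polarization.
print(beam_spin_asymmetry(n_plus=5200, n_minus=4800, beam_pol=0.85))
```

In practice the asymmetry is measured bin by bin in the kinematic variables and as a function of the azimuthal angle, and the Compton form factors are fit to that dependence.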